n533  0144  26 Dec 81
BC-COMPUTE-2dadd-12-26
    X X X BUCK ROGERS TYPEWRITER.''
    But the computer age had been launched. And during the next three decades, as computer-builders made one quantum jump in design after another, the machines would grow steadily smaller, faster and more energy-efficient.
    The key to their rapid evolution lay in their very essence; it lay in the fact that computers operate on the binary number system, whose two digits (one and zero) are represented by switches that can be set to either the on or the off position.
    The speed of the computer, then, depends, at least in part, on how
quickly you can open and close the switches: quicker switches mean
quicker computers.
    As the machines evolved, their designers went from primitive mechanical switches (''relays'') to faster vacuum tubes, to even faster transistors - and, finally, to today's ''integrated circuits,'' which are actually platforms designed to carry entire arrays of switches in combinations that accelerate the on-off process enormously.
    Along the evolutionary road, the primitive ENIAC shrank to the size
of today's average home computer, a device which weighs less than 30
pounds and uses about one-thousandth the electrical energy consumed
by the original electronic granddaddy.
    (The evolution continues today, of course. ''They're getting
amazingly complex,'' explains Dr. O'Rourke. ''In some cases, in fact,
they're butting up against fundamental limits - such as the speed of
light, or the speed of electricity in the wires. You can jam the
circuits closer and closer together, but then you start to get a
problem with heat. Probably the ultimate would be a computer of about
one cubic inch, which will be more powerful than the things that once
filled up an entire room, and it would be in a bath of liquid helium,
perhaps, so that it's super-cooled and superconducting. I understand
that IBM is already working toward this.'')
    As the machines grew smaller, faster and less expensive, of course,
they began to invade every phase of American life. A single statistic
demonstrates the extent of that invasion: in the late 1950s, as the
second-generation, transistor-based computers were just taking off,
Americans were spending about $1 billion annually on data processing
equipment; by 1980, they were spending more than $55 billion a year
on the same kinds of digital gear.
    (In 1980, jumbo-sized IBM earned 63 percent of the dollars spent on general-purpose, American-made computers; seven other firms - Honeywell, Sperry-Univac, Burroughs, NCR, Control Data, GE and RCA, also known in the trade as ''The Seven Dwarfs'' - divided the rest.)
    Today computers can be found at work in - and often in command of - many areas of American life. Small microprocessors, for example, control key components in washers and dryers, automobiles, calculators, watches, stereos, traffic signals, and even children's toys.
    Video games provide another example of the extent of the boom: only
a decade after the invention of ''Pong,'' the first computerized
television game, more than four million Americans now own the
programmable video display terminals on which the computer contests
are waged. And the video-computer apotheosis now appears to be in
sight: according to recent reports, the McDonald's hamburger chain
has begun negotiating with Atari for a video system, eventually to be
installed in every restaurant, in which a machine will first take
your meal order (you'll punch it in on the keyboard) - and then play
a videogame with you while you wait for your burgers and fries.
    In industry, the move to computer technology has been just as
pervasive and just as sudden. While General Motors and General
Electric continue to explore better techniques for computerizing
assembly lines and other manufacturing processes, Japanese ''robot
factories'' (in one of them, 10,000 computer-directed robots now work 24 hours a day, without coffee breaks) are rapidly becoming commonplace in that country's efficiency-minded economy.
    The list of benefits provided by these new applications of computer
technology seems almost endless.
    But so does the list of problems which accompanies it.
    
    The Computer Scientist
    
    He sits with his feet up.
    This is a Monday afternoon on the campus of The Johns Hopkins
University, and Computer Science Professor Joe O'Rourke turns out to
be a very young-looking 30-year-old man in an ordinary plaid workshirt, with big eyeglasses, trim slacks and curly, close-cropped hair. Dr. O'Rourke looks neat and organized and very
clean. But he also looks very relaxed, easygoing - not at all like
the stiff, rigid-faced zombie of the computer programmer cliches.
    ''It's all very hard to predict,'' says Joe O'Rourke, who attended
an eastern liberal arts college before earning a Ph.D. in computer
science at the University of Pennsylvania. He speaks softly, in a
high-pitched but genial voice; he laughs a great deal. He points out
that he's married to a historian; he also points out that his brother
is a dedicated sculptor - but a sculptor who these days often
executes his artworks on the video display computer terminals that
Joe taught him how to use.
    (MORE)
    
nyt-12-26-81 0444est